Crowdsourcing for Usability Testing

Authors

  • Di Liu
  • Matthew Lease
  • Rebecca Kuipers
  • Randolph G. Bias
Abstract

While usability evaluation is critical to designing usable websites, traditional usability testing can be both expensive and time-consuming. The advent of crowdsourcing platforms such as Amazon Mechanical Turk and CrowdFlower offers an intriguing new avenue for performing remote usability testing with potentially many users, quick turn-around, and significant cost savings. To investigate the potential of such crowdsourced usability testing, we conducted two similar (though not completely parallel) usability studies evaluating a graduate school’s website: one in a traditional usability lab setting, and the other using crowdsourcing. While we find that crowdsourcing exhibits some notable limitations in comparison to the traditional lab environment, its applicability and value for usability testing are clearly evidenced. We discuss both methodological differences for crowdsourced usability testing and empirical contrasts to results from more traditional, face-to-face usability testing.

Similar resources

To Permit or Not to Permit, That is the Usability Question: Crowdsourcing Mobile Apps' Privacy Permission Settings

Millions of apps available to smartphone owners request various permissions to resources on the device, including sensitive data such as location and contact information. Disabling permissions for sensitive resources could improve privacy but can also impact the usability of apps in ways users may not be able to predict. We study an efficient approach that ascertains the impact of disabling per...

Crowdsourced Web Site Evaluation with CrowdStudy

Many different automatic usability evaluation tools have been developed specifically for web sites and web-based services, but they usually cannot replace user testing. At the same time, traditional usability evaluation methods can be both expensive and time-consuming. We will demonstrate CrowdStudy, a toolkit for crowdsourced testing of web interfaces that allows not only to efficiently recru...

The Robot Management System: A Framework for Conducting Human-Robot Interaction Studies Through Crowdsourcing

Human-Robot Interaction (HRI) is a rapidly expanding field of study that focuses on allowing non-roboticist users to naturally and effectively interact with robots. Conducting extensive user studies has become a fundamental component of HRI research; however, due to the nature of robotics research, such studies often become expensive, time-consuming, and limited to constrained ...

Xamobile: Usability Evaluation of Text Input Methods on Mobile Devices for Historical African Languages

Customized text input editors on mobile devices for languages with no standard language models, such as some African languages, are vital to allow text input tasks to be crowdsourced and thus enable quick and precise participation. We investigated four different mobile input techniques for complex language scripts such as |Xam and collected accuracy data from experiments with the Xwerty, T9, Pinyin s...

Crowdsourcing to Mobile Users: A Study of the Role of Platforms and Tasks

We study whether the tasks currently proposed on crowdsourcing platforms are suitable for mobile devices. We aim at understanding both (i) which of the existing crowdsourcing platforms are better suited to mobile devices, and (ii) which kinds of tasks are better suited to mobile devices. Results of a user study hint that some crowdsourcing platforms seem better suited to mobile devices...

Journal:
  • CoRR

Volume: abs/1203.1468

Publication date: 2012